Latent Vector Weighting for Word Meaning in Context
Authors
Abstract
This paper presents a novel method for the computation of word meaning in context. We make use of a factorization model in which words, together with their window-based context words and their dependency relations, are linked to latent dimensions. The factorization model allows us to determine which dimensions are important for a particular context, and to adapt the dependency-based feature vector of the word accordingly. The evaluation on a lexical substitution task, carried out for both English and French, indicates that our approach outperforms state-of-the-art methods while at the same time providing more accurate meaning representations.
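The full model is described in the paper itself; the sketch below only illustrates the general reweighting idea in a minimal form. It assumes a toy word-by-context-word count matrix, scikit-learn's NMF as the factorization, and a simple normalization for combining the context's latent-dimension profile with the target word's vector; none of these choices are taken from the paper.

```python
# Minimal, illustrative sketch of latent-dimension reweighting, not the
# authors' exact model: factorize a toy word x context-word co-occurrence
# matrix, estimate how active each latent dimension is for an observed
# context, and use that profile to reweight the target word's vector.
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)

# Toy co-occurrence counts: rows = target words, columns = context words.
# (The paper's factorization additionally links dependency relations;
# a single matrix is used here for brevity.)
cooc = rng.poisson(1.0, size=(50, 200)).astype(float)

nmf = NMF(n_components=10, init="nndsvda", max_iter=500, random_state=0)
W = nmf.fit_transform(cooc)   # target words x latent dimensions
H = nmf.components_           # latent dimensions x context words

def dimension_weights(context_word_ids):
    """Estimate how strongly the observed context activates each latent dimension."""
    profile = H[:, context_word_ids].sum(axis=1)
    return profile / (profile.sum() + 1e-12)

def adapt_vector(word_id, context_word_ids):
    """Reweight the target word's latent profile by the context's dimension weights."""
    adapted = W[word_id] * dimension_weights(context_word_ids)
    return adapted / (np.linalg.norm(adapted) + 1e-12)

# Example: adapt word 3 to a context made up of context words 5, 17 and 42.
print(adapt_vector(3, [5, 17, 42]))
```

The same adapted vector can then feed any downstream similarity computation, e.g. ranking lexical substitution candidates by cosine similarity, which is the kind of evaluation the abstract refers to.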
Similar resources
Measuring Distributional Similarity in Context
The computation of meaning similarity as operationalized by vector-based models has found widespread use in many tasks ranging from the acquisition of synonyms and paraphrases to word sense disambiguation and textual entailment. Vector-based models are typically directed at representing words in isolation and thus best suited for measuring similarity out of context. In this paper we propose a pr...
Topic Models for Meaning Similarity in Context
Recent work on distributional methods for similarity focuses on using the context in which a target word occurs to derive context-sensitive similarity computations. In this paper we present a method for computing similarity which builds vector representations for words in context by modeling senses as latent variables in a large corpus. We apply this to the Lexical Substitution Task and we show...
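As a rough illustration of the general idea of modeling senses as latent variables (not the cited paper's actual model), the sketch below trains scikit-learn's LatentDirichletAllocation on a toy corpus and combines the context's topic mixture with the target word's per-topic probability to obtain a context-sensitive vector; the corpus, topic count, and combination rule are placeholder assumptions.

```python
# Rough illustration of "senses as latent variables": a word in context is
# represented by how compatible each latent topic is with both the word and
# its observed context. Not the cited paper's actual model.
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = [
    "the bank approved the mortgage loan",
    "the bank raised interest rates on deposits",
    "the river bank was covered in mud",
    "they walked along the bank of the river",
]
vectorizer = CountVectorizer()
X = vectorizer.fit_transform(docs)

lda = LatentDirichletAllocation(n_components=2, random_state=0)
lda.fit(X)

# Per-topic word distributions (topics x vocabulary), normalized per topic.
topic_word = lda.components_ / lda.components_.sum(axis=1, keepdims=True)

def word_in_context(word, context):
    """Combine the context's topic mixture with the word's per-topic probability."""
    doc_topics = lda.transform(vectorizer.transform([context]))[0]
    word_id = vectorizer.vocabulary_[word]
    vec = doc_topics * topic_word[:, word_id]
    return vec / vec.sum()

# The same word gets different topic profiles in different contexts.
print(word_in_context("bank", "the river bank was covered in mud"))
print(word_in_context("bank", "the bank approved the mortgage loan"))
```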
Pii: S0364-0213(01)00034-9
In Latent Semantic Analysis (LSA) the meaning of a word is represented as a vector in a high-dimensional semantic space. Different meanings of a word or different senses of a word are not distinguished. Instead, word senses are appropriately modified as the word is used in different contexts. In N-VP sentences, the precise meaning of the verb phrase depends on the noun it is combined with. An a...
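For reference, the following is a minimal LSA sketch, unrelated to the cited paper's specific predication account: term vectors are derived from a toy term-document matrix with truncated SVD, and the single vector per term illustrates the sense-conflation limitation mentioned above. The documents and dimensionality are placeholders.

```python
# Minimal LSA sketch: build a term-document count matrix and project terms
# into a low-dimensional latent semantic space with truncated SVD.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import TruncatedSVD

docs = [
    "the bank approved the loan",
    "the river bank was flooded",
    "the loan was repaid to the bank",
    "the flood reached the river",
]
vectorizer = CountVectorizer()
X = vectorizer.fit_transform(docs)          # documents x terms

svd = TruncatedSVD(n_components=2, random_state=0)
svd.fit(X)

# Columns of components_ correspond to terms; each term gets one vector in
# the latent space, so different senses of a word are not distinguished.
term_vectors = svd.components_.T
print(dict(zip(vectorizer.get_feature_names_out(), term_vectors.round(2))))
```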
CCG Categories for Distributional Semantic Models
For the last decade, distributional semantics has been an active area of research to address the problem of understanding the semantics of words in natural language. The core principle of the distributional semantic approach is that the linguistic context surrounding a given word, which is represented as a vector, provides important information about its meaning. In this paper we investigate th...
Semantic Density Analysis: Comparing Word Meaning across Time and Phonetic Space
This paper presents a new statistical method for detecting and tracking changes in word meaning, based on Latent Semantic Analysis. By comparing the density of semantic vector clusters this method allows researchers to make statistical inferences on questions such as whether the meaning of a word changed across time or if a phonetic cluster is associated with a specific meaning. Possible applic...
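As a small illustration of the density-comparison idea (not the cited method's actual statistical tests), the sketch below approximates the density of a word's cluster of occurrence vectors by the mean pairwise cosine distance and compares two synthetic time periods; the data and the density measure are placeholder assumptions.

```python
# Illustrative density comparison: a smaller mean pairwise cosine distance
# among a word's occurrence vectors indicates a denser, more uniform cluster.
import numpy as np
from scipy.spatial.distance import pdist

rng = np.random.default_rng(0)
centre = rng.normal(size=50)

# Synthetic occurrence vectors of one word in two time periods.
period_a = centre + rng.normal(0.0, 0.1, size=(100, 50))  # tight cluster
period_b = centre + rng.normal(0.0, 1.0, size=(100, 50))  # diffuse cluster

def cluster_density(vectors):
    """Mean pairwise cosine distance; smaller values mean a denser cluster."""
    return pdist(vectors, metric="cosine").mean()

print("period A:", cluster_density(period_a))
print("period B:", cluster_density(period_b))
```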
Journal title:
Volume / Issue:
Pages: -
Publication date: 2011